Lower Bounds for Oblivious Subspace Embeddings
Abstract
An oblivious subspace embedding (OSE) for some ε, δ ∈ (0, 1/3) and d ≤ m ≤ n is a distribution D over R^{m×n} such that for any linear subspace W ⊂ R^n of dimension d, P_{Π∼D}(∀x ∈ W, (1 − ε)‖x‖2 ≤ ‖Πx‖2 ≤ (1 + ε)‖x‖2) ≥ 1 − δ. We prove that any OSE with δ < 1/3 must have m = Ω((d + log(1/δ))/ε²), which is optimal. Furthermore, if every Π in the support of D is sparse, with at most s non-zero entries per column, we show tradeoff lower bounds between m and s.
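As an illustrative aside (not part of the paper), the OSE property can be checked empirically for a dense Gaussian sketch, a classical construction known to satisfy the definition; all dimensions below are toy choices, not the lower-bound constants from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 1000, 10, 0.25
m = 400  # sketch dimension, comfortably above d/eps^2 for this toy check

# A random d-dimensional subspace W of R^n, via an orthonormal basis U.
U, _ = np.linalg.qr(rng.standard_normal((n, d)))

# Dense Gaussian sketch Pi with entries N(0, 1/m): a classical OSE.
Pi = rng.standard_normal((m, n)) / np.sqrt(m)

# Check (1 - eps)||x|| <= ||Pi x|| <= (1 + eps)||x|| on random vectors in W.
xs = U @ rng.standard_normal((d, 200))
ratios = np.linalg.norm(Pi @ xs, axis=0) / np.linalg.norm(xs, axis=0)
print(ratios.min(), ratios.max())
```

With m well above d/ε², the extreme ratios land inside [1 − ε, 1 + ε]; the lower bound in the abstract says no distribution can do substantially better than m = Ω((d + log(1/δ))/ε²).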
Similar Resources
Nearly Tight Oblivious Subspace Embeddings by Trace Inequalities
We present a new analysis of sparse oblivious subspace embeddings, based on the "matrix Chernoff" technique. These are probability distributions over (relatively) sparse matrices such that, for any d-dimensional subspace of R^n, the norms of all vectors in the subspace are simultaneously approximately preserved by the embedding with high probability, typically with parameters depending on d but not...
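For intuition, the sparsest member of this family is a CountSketch-style embedding with s = 1 nonzero entry per column; the sketch below is purely illustrative, with toy dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2000, 512

# CountSketch: each column of Pi has exactly one nonzero, a random sign
# placed in a random row -- the s = 1 extreme of the sparsity tradeoff.
rows = rng.integers(0, m, size=n)
signs = rng.choice([-1.0, 1.0], size=n)
Pi = np.zeros((m, n))
Pi[rows, np.arange(n)] = signs

x = rng.standard_normal(n)
ratio = np.linalg.norm(Pi @ x) / np.linalg.norm(x)
print(ratio)  # close to 1, since E[||Pi x||^2] = ||x||^2
```

Applying Π to a vector touches each coordinate once, which is what makes sparse embeddings fast; the tradeoff results referenced above constrain how small m and s can simultaneously be.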
Tight Bounds for ℓp Oblivious Subspace Embeddings
An ℓp oblivious subspace embedding is a distribution over r × n matrices Π such that for any fixed n × d matrix A, P_Π[for all x, ‖Ax‖p ≤ ‖ΠAx‖p ≤ κ‖Ax‖p] ≥ 9/10, where r is the dimension of the embedding, κ is the distortion of the embedding, and for an n-dimensional vector y, ‖y‖p = (∑_{i=1}^n |y_i|^p)^{1/p} is the ℓp-norm. Another important property is the sparsity of Π, that is, the maximum number...
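A hedged illustration of the ℓp-norm just defined, paired with a dense Cauchy (1-stable) sketch, a standard construction for p = 1; the dimensions and normalization here are assumptions for the demo, not parameters from the paper:

```python
import numpy as np

def lp_norm(y, p):
    # ||y||_p = (sum_i |y_i|^p)^(1/p), as defined in the abstract
    return (np.abs(y) ** p).sum() ** (1.0 / p)

rng = np.random.default_rng(2)
n, r = 500, 800
y = rng.standard_normal(n)

# Dense Cauchy sketch: by 1-stability, each entry of Pi @ y is
# distributed as ||y||_1 times a standard Cauchy variable.
Pi = rng.standard_cauchy((r, n))
est = np.median(np.abs(Pi @ y))  # median of |Cauchy| is 1, so est ~ ||y||_1
ratio = est / lp_norm(y, 1)
print(ratio)  # close to 1
```

The median is used instead of a mean because Cauchy variables have no finite expectation; the distortion κ of such constructions is what the tight bounds in this line of work pin down.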
Subspace Embeddings and ℓp-Regression Using Exponential Random Variables
Oblivious low-distortion subspace embeddings are a crucial building block for numerical linear algebra problems. We show that for any real p, 1 ≤ p < ∞, given a matrix M ∈ R^{n×d} with n ≫ d, with constant probability we can choose a matrix Π with max(1, n^{1−2/p})·poly(d) rows and n columns so that simultaneously for all x ∈ R^d, ‖Mx‖p ≤ ‖ΠMx‖∞ ≤ poly(d)‖Mx‖p. Importantly, ΠM can be computed in the optimal O(nnz(M)...
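Constructions based on exponential random variables rest on their max-stability: if E_1, …, E_n are i.i.d. Exp(1), then max_i |y_i|^p / E_i is distributed as ‖y‖p^p / E for a single E ~ Exp(1). A hedged numerical check of this identity (parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
p, n, trials = 2.0, 300, 4000
y = rng.standard_normal(n)
true_p = (np.abs(y) ** p).sum()  # ||y||_p^p

# Max-stability: each row of E yields one sample of ||y||_p^p / Exp(1),
# since P(max_i |y_i|^p / E_i <= t) = exp(-||y||_p^p / t).
E = rng.exponential(1.0, size=(trials, n))
samples = (np.abs(y) ** p / E).max(axis=1)

# The median of Exp(1) is ln 2, so median(samples) ~ ||y||_p^p / ln 2.
val = np.median(samples) * np.log(2) / true_p
print(val)  # close to 1
```

This single-coordinate-dominates behavior is why the embedding guarantee above naturally pairs an ℓp-norm on the left with an ℓ∞-norm in the middle.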
Subspace Embeddings for the Polynomial Kernel
Sketching is a powerful dimensionality reduction tool for accelerating statistical learning algorithms. However, its applicability has been limited to a certain extent since the crucial ingredient, the so-called oblivious subspace embedding, can only be applied to data spaces with an explicit representation as the column span or row span of a matrix, while in many settings learning is done in a...